Imagine that you have been asked to produce a report documenting the following for an API endpoint across a specified time range:

  • Total number of API hits (per hour)
  • Number of successful API hits (per hour)
  • Number of unsuccessful API hits (per hour)
  • Status of the endpoint at the end of each hour (endpoint is UP or DOWN)

We could run a simple search such as this:

source="test_API_data.csv" host="test_API_data" index="main" sourcetype="csv"
| table _time, total_number_of_hits, successful_hits, unsuccessful_hits, status
| sort +_time

And we would get a table of results such as this:

However, there is a problem. We can see that certain hours of the day do not appear in this data; 19/01/2020 13:00 and 19/01/2020 18:00, for example, are missing. There was no activity on this endpoint at those times, so there is no data for Splunk to return.

We would still like to include these times in our report. If, instead of a table, we now use “timechart” with a span of 1 hour:

source="test_API_data.csv" host="test_API_data" index="main" sourcetype="csv"
| timechart span=1hr values(total_number_of_hits) as total_number_of_hits, values(successful_hits) as successful_hits, values(unsuccessful_hits) as unsuccessful_hits, values(status) as status
| sort +_time

We now get:

We now have placeholders for the hours in which we observed no activity, but we wish to populate these with some kind of information.

For the hours where no activity was observed, total_number_of_hits, successful_hits and unsuccessful_hits should all be zero rather than null. We can rectify this with the “fillnull” command, which replaces the null values of the specified fields with zeroes (the default fill value) instead.

If we use fillnull like this:

source="test_API_data.csv" host="test_API_data" index="main" sourcetype="csv"
| timechart span=1hr values(total_number_of_hits) as total_number_of_hits, values(successful_hits) as successful_hits, values(unsuccessful_hits) as unsuccessful_hits, values(status) as status
| sort +_time
| fillnull total_number_of_hits, successful_hits, unsuccessful_hits

We then get an updated table that looks like this:

We can see that the “missing hours” now have rows of zeroes, which tells us that there was no activity during these hours rather than ambiguously omitting them.
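To make the behaviour concrete, here is a minimal Python sketch of what fillnull is doing for these count fields. The rows and values below are illustrative placeholders, not the actual report data:

```python
# Rows as dictionaries, mirroring the report's columns; None plays the
# role of Splunk's null values. Data is made up for illustration.
rows = [
    {"_time": "19/01/2020 00:00", "total_number_of_hits": 10,
     "successful_hits": 9, "unsuccessful_hits": 1},
    {"_time": "19/01/2020 01:00", "total_number_of_hits": None,
     "successful_hits": None, "unsuccessful_hits": None},
]

COUNT_FIELDS = ["total_number_of_hits", "successful_hits", "unsuccessful_hits"]

def fillnull(rows, fields, value=0):
    """Replace null (None) values of the named fields with the given value."""
    for row in rows:
        for field in fields:
            if row.get(field) is None:
                row[field] = value
    return rows

fillnull(rows, COUNT_FIELDS)
```

As in Splunk, fields that already hold a value are left untouched; only the nulls in the listed fields are overwritten with zero.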

However, the “status” column is still empty for these missing hours. The status is the state of the API endpoint at the end of each hour: if the API was available at the end of the hour then the status is reported as UP, and conversely, if the API was unavailable then the status is reported as DOWN.

Since there were no hits during the missing hours, there is nothing to tell us whether our API endpoint was available or not.

We might reasonably assume that for each missing hour, the API status is the same as that of the previous hour. Under this assumption we can use Splunk’s “filldown” command to fill in the missing values.

Filldown looks for empty values for a particular field and updates them to be that of the last known, non-empty value for that field.

Looking at the table, we can see that for the row for 19/01/2020 01:00 the last known value for status was UP (which comes from the 19/01/2020 00:00 row). Similarly, for the row for 19/01/2020 03:00 the last known value for status was DOWN (which comes from the 19/01/2020 02:00 row).

If we now add the filldown command to the search:

source="test_API_data.csv" host="test_API_data" index="main" sourcetype="csv"
| timechart span=1hr values(total_number_of_hits) as total_number_of_hits, values(successful_hits) as successful_hits, values(unsuccessful_hits) as unsuccessful_hits, values(status) as status
| sort +_time
| fillnull total_number_of_hits, successful_hits, unsuccessful_hits
| filldown status

We get a final table of:

All hours for the period in question are represented in the table and all columns are populated with some piece of information.
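The filldown step can be sketched in Python as well: it is a last-observation-carried-forward pass over the rows. The timestamps and statuses below are hypothetical, chosen only to mirror the walkthrough above:

```python
def filldown(rows, field):
    """Carry the last known non-empty value of a field forward into empty rows."""
    last_seen = None
    for row in rows:
        if row.get(field) is None:
            row[field] = last_seen  # inherit the previous hour's value
        else:
            last_seen = row[field]  # remember the latest non-empty value
    return rows

# Illustrative data: two "missing" hours with no recorded status.
rows = [
    {"_time": "19/01/2020 00:00", "status": "UP"},
    {"_time": "19/01/2020 01:00", "status": None},   # no activity this hour
    {"_time": "19/01/2020 02:00", "status": "DOWN"},
    {"_time": "19/01/2020 03:00", "status": None},   # no activity this hour
]
filldown(rows, "status")
# The 01:00 row inherits UP; the 03:00 row inherits DOWN.
```

Note that a leading empty value (before any non-empty value has been seen) stays empty, which matches why filldown alone cannot conjure a status for hours before the first observation.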


For 2021 we’ve committed to posting a new Splunk tip every week!

If you want to keep up to date on tips like the one above then sign up below:

Subscribe to our newsletter to receive regular updates from iDelta, including news and updates, information on upcoming events, and Splunk tips and tricks from our team of experts. You can also find us on Twitter and LinkedIn.


Posted by: Andrew MacLeod

Andrew is a certified Splunk Admin and has worked for iDelta for over two years. Previously, he worked as an actuarial analyst in the life and pensions industry - a role that he was in for over 7 years before deciding to embark on a career change into the IT industry. He holds an MPhys degree in theoretical physics from the University of Edinburgh. Outside of work he is a big puzzle fan, with a particular penchant for things cruciverbal and mathematical.